The Context-Tree Weighting Method: Extensions (IEEE Transactions on Information Theory)
Abstract
First we modify the basic (binary) context-tree weighting method so that the past symbols x_{1-D}, x_{2-D}, ..., x_0 are not needed by the encoder and the decoder. Then we describe how to make the context-tree depth D infinite, which results in optimal redundancy behavior for all tree sources, while the number of records in the context tree is not larger than 2T - 1, where T is the length of the source sequence. For this extended context-tree weighting algorithm we show that, with probability one, the compression ratio is not larger than the source entropy as the source sequence length T → ∞, for stationary and ergodic sources.
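To make the weighting concrete, the sketch below implements plain depth-bounded binary CTW with the Krichevsky-Trofimov estimator and half-half mixing at internal nodes, and simply pads the unavailable past symbols x_{1-D}, ..., x_0 with zeros; removing exactly that assumption (and the depth bound D) is what the extensions above address. The names Node, ctw_codelength, and log2_mix, as well as the zero-padding convention, are illustrative choices and not taken from the paper.

```python
import math

def log2_mix(la, lb):
    """log2(0.5 * 2**la + 0.5 * 2**lb), computed without underflow."""
    m, n = max(la, lb), min(la, lb)
    return m - 1.0 + math.log2(1.0 + 2.0 ** (n - m))

class Node:
    """One context-tree node: zero/one counts plus log2 Pe and log2 Pw."""
    __slots__ = ("a", "b", "log_pe", "log_pw", "children")
    def __init__(self):
        self.a = 0           # zeros observed in this context
        self.b = 0           # ones observed in this context
        self.log_pe = 0.0    # log2 of the Krichevsky-Trofimov estimate
        self.log_pw = 0.0    # log2 of the weighted probability
        self.children = [None, None]

def ctw_codelength(bits, depth=8):
    """Return the CTW codelength -log2 Pw(x_1 .. x_T) in bits.

    Plain depth-bounded CTW; the unavailable past symbols x_{1-D} .. x_0
    are taken to be zeros here, which is exactly the assumption that the
    'Extensions' paper shows how to drop.
    """
    root = Node()
    padded = [0] * depth + list(bits)
    for t in range(depth, len(padded)):
        x = padded[t]
        # Walk down the context path (most recent past bit first).
        path = [root]
        node = root
        for d in range(1, depth + 1):
            c = padded[t - d]
            if node.children[c] is None:
                node.children[c] = Node()
            node = node.children[c]
            path.append(node)
        # Update Pe and Pw from the deepest node back up to the root.
        for node in reversed(path):
            counts = (node.a, node.b)
            # Krichevsky-Trofimov predictive probability of symbol x.
            node.log_pe += math.log2((counts[x] + 0.5) / (node.a + node.b + 1.0))
            if node is path[-1]:
                node.log_pw = node.log_pe          # depth-D leaf: Pw = Pe
            else:                                  # internal node: mix Pe with children
                lc = sum(ch.log_pw for ch in node.children if ch is not None)
                node.log_pw = log2_mix(node.log_pe, lc)
            if x:
                node.b += 1
            else:
                node.a += 1
    return -root.log_pw
```

As a quick check, ctw_codelength([0, 1] * 64, depth=4) yields a codelength far below the 128 bits of the raw sequence, since the alternating source is a simple tree source that the weighting identifies quickly.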
Similar articles
Reflections on "The Context-Tree Weighting Method: Basic Properties"
Copyright © 1997 IEEE. Reprinted from IEEE Information Theory Society Newsletter, Vol. 47, No. 1, March 1997. This material is posted here with permission of the IEEE. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obt...
The context-tree weighting method: basic properties
We describe a sequential universal data compression procedure for binary tree sources that performs the "double mixture." Using a context tree, this method weights in an efficient recursive way the coding distributions corresponding to all bounded memory tree sources, and achieves a desirable coding distribution for tree sources with an unknown model and unknown parameters. Computational and ...
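In rough notation (paraphrased, not copied from the paper), the "double mixture" at a node s of the context tree combines the Krichevsky-Trofimov estimate P_e, built from the a_s zeros and b_s ones seen in context s, with the product of the weighted probabilities of the two child contexts:

$$
P_w^s =
\begin{cases}
P_e(a_s, b_s), & \text{if } s \text{ has depth } D,\\[4pt]
\tfrac{1}{2}\, P_e(a_s, b_s) + \tfrac{1}{2}\, P_w^{0s}\, P_w^{1s}, & \text{otherwise,}
\end{cases}
\qquad
P_e(a+1, b) = \frac{a + \tfrac{1}{2}}{a + b + 1}\, P_e(a, b), \quad P_e(0, 0) = 1.
$$

The mixing over parameter estimates at each node and over child models is what realizes, in a single recursive pass, the weighting over "all bounded memory tree sources" mentioned in the abstract.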
Analysis of a complexity-based pruning scheme for classification trees
A complexity-based pruning procedure for classification trees is described, and bounds on its finite sample performance are established. The procedure selects a subtree of a (possibly random) initial tree in order to minimize a complexity-penalized measure of empirical risk. The complexity assigned to a subtree is proportional to the square root of its size. Two cases are considered. In the fir...
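As an illustration of the selection rule described above, the brute-force sketch below enumerates every pruned subtree of a small binary tree and keeps the one minimizing empirical risk plus a penalty proportional to the square root of the subtree's size. The tree encoding, the function names, and the exact scaling of the penalty are illustrative assumptions, not the paper's construction.

```python
import math
from itertools import product

# Illustrative encoding: a node is ('leaf', errors_as_leaf) or
# ('split', errors_if_pruned_to_leaf, left_child, right_child),
# where the error counts come from the training sample.

def pruned_options(node):
    """Yield (empirical_errors, size_in_nodes) for every pruning rooted at node."""
    if node[0] == 'leaf':
        yield node[1], 1
        return
    _, errors_if_pruned, left, right = node
    yield errors_if_pruned, 1                       # cut the tree at this node
    for (el, sl), (er, sr) in product(pruned_options(left), pruned_options(right)):
        yield el + er, sl + sr + 1                  # keep this split

def select_pruning(root, n_samples, c=1.0):
    """Minimize empirical risk + c * sqrt(size / n); exponential, small trees only."""
    return min(pruned_options(root),
               key=lambda t: t[0] / n_samples + c * math.sqrt(t[1] / n_samples))
```

A full implementation would also return the chosen subtree itself and set the constant c according to the paper's analysis; here only the (errors, size) pair of the winning pruning is reported.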
Context weighting for general finite-context sources
Context weighting procedures are presented for sources with models (structures) in four different classes. Although the procedures are designed for universal data compression purposes, their generality allows application in the area of classification.